Conversation
write more tests
plotters example
Fix references to renamed account.
Add indicatif progress bars for examples and increase generation count.
Prevent duplicated neuron inputs
create custom activations example
@copilot open a PR and update the activation registry methods based on the comments in this thread
@HyperCodec I've opened a new pull request, #109, to work on those changes. Once the pull request is ready, I'll request review from you. |
Co-authored-by: HyperCodec <72839119+HyperCodec@users.noreply.github.com>
Fix `random_activation_in_scope` panic and simplify implementation
all that should be left is bumping genetic-rs to a hotfix version once that's out.
Pull request overview
Copilot reviewed 15 out of 17 changed files in this pull request and generated 2 comments.
Comments suppressed due to low confidence (2)
src/activation.rs:88
`ActivationRegistry::activations_in_scope` uses `a.scope.contains(scope)` without handling `scope == NeuronScope::NONE`. Because every bitflags value contains `NONE` (0 bits), calling `activations_in_scope(NeuronScope::NONE)` will currently return all activations, contradicting the documented meaning of `NONE` ("will not be randomly placed anywhere").
Fix: explicitly return an empty vec for `scope == NeuronScope::NONE`, or add a guard in the filter (e.g., `scope != NONE && a.scope != NONE && a.scope.contains(scope)`).
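For illustration, here is a self-contained sketch of the pitfall and of the guarded predicate suggested above. The `NeuronScope` flags below are a stand-in definition for the example, not the crate's real ones:

use bitflags::bitflags;

bitflags! {
    // Illustrative stand-in for the crate's NeuronScope flags.
    pub struct NeuronScope: u8 {
        const NONE   = 0;
        const HIDDEN = 0b01;
        const OUTPUT = 0b10;
    }
}

fn main() {
    // Every bitflags value "contains" the empty set, so an unguarded
    // `a.scope.contains(scope)` matches every activation when `scope` is NONE.
    assert!(NeuronScope::HIDDEN.contains(NeuronScope::NONE));
    assert!(NeuronScope::OUTPUT.contains(NeuronScope::NONE));

    // Guarded predicate along the lines suggested above.
    let in_scope = |activation_scope: NeuronScope, scope: NeuronScope| {
        !scope.is_empty() && !activation_scope.is_empty() && activation_scope.contains(scope)
    };
    assert!(!in_scope(NeuronScope::HIDDEN, NeuronScope::NONE));
    assert!(in_scope(NeuronScope::HIDDEN, NeuronScope::HIDDEN));
}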
src/activation.rs:206
`ActivationFn` deserialization panics when an activation name is missing (`panic!("Activation function {name} not found")`). Panicking inside `Deserialize` makes malformed/untrusted input able to crash consumers; serde expects this to return a `D::Error` instead.
Fix: replace the panic with `Err(serde::de::Error::custom(...))` (and avoid `unwrap()` by using `ok_or_else`).
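A minimal sketch of that change, assuming `ActivationFn` is deserialized from its name and resolved through a registry; the `lookup_activation` helper and the stub type here are hypothetical stand-ins for the crate's real API:

use serde::{Deserialize, Deserializer};

// Stand-in type for the sketch; the real ActivationFn lives in src/activation.rs.
#[derive(Clone)]
pub struct ActivationFn;

// Hypothetical registry lookup; the real code would consult the activation registry.
fn lookup_activation(_name: &str) -> Option<ActivationFn> {
    None
}

impl<'de> Deserialize<'de> for ActivationFn {
    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
    where
        D: Deserializer<'de>,
    {
        let name = String::deserialize(deserializer)?;
        // Surface a deserialization error instead of panicking on unknown names.
        lookup_activation(&name).ok_or_else(|| {
            serde::de::Error::custom(format!("activation function {name} not found"))
        })
    }
}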
@copilot open a new pull request to apply changes based on the comments in this thread
@HyperCodec I've opened a new pull request, #110, to work on those changes. Once the pull request is ready, I'll request review from you. |
Co-authored-by: HyperCodec <72839119+HyperCodec@users.noreply.github.com>
…on panic Co-authored-by: HyperCodec <72839119+HyperCodec@users.noreply.github.com>
Fix ActivationRegistry scope handling and ActivationFn deserialization panic
Pull request overview
Copilot reviewed 15 out of 17 changed files in this pull request and generated 2 comments.
Comments suppressed due to low confidence (1)
src/activation.rs:29
`activation_fn!` passes `stringify!($F).into()` as the activation name, but `ActivationFn::new` takes `name: &'static str`. This will not compile (it produces a `String`). Use `stringify!($F)` directly (and similarly for the scoped macro arm).
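As a quick illustration of the suggested fix: `stringify!` expands to a string literal, which already has the `&'static str` type the constructor expects, so no conversion is needed:

fn main() {
    // `stringify!` produces a `&'static str`, so it can be passed straight
    // to a `name: &'static str` parameter without calling `.into()`.
    let name: &'static str = stringify!(sigmoid);
    assert_eq!(name, "sigmoid");
}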
rayon = "1.11.0"
replace_with = "0.1.8"
serde = { version = "1.0.228", features = ["derive"], optional = true }
serde-big-array = { version = "0.5.1", optional = true }
The crate uses the external `rand` crate directly (e.g. `rand::rng()`, `rand::random()` in examples and `src/neuralnet.rs`), but `rand` is not listed in `[dependencies]`. Transitive deps are not available for `rand::...` paths, so this will fail to compile. Add an explicit `rand` dependency (matching the API you use), or update all uses to go through the `genetic_rs::prelude::rand` re-export instead of `rand::...`.
Suggested change:
serde-big-array = { version = "0.5.1", optional = true }
rand = "0.9"
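Alternatively, a sketch of the re-export route mentioned in the comment; it assumes genetic-rs exposes `rand` from its prelude and pins rand 0.9 (the `rand::rng()` API referenced above), so treat the paths and version as assumptions:

use genetic_rs::prelude::rand::{self, Rng};

// Sample a connection weight through the re-exported rand, avoiding a direct dependency.
fn random_weight() -> f32 {
    rand::rng().random_range(-1.0..1.0)
}

fn main() {
    println!("sampled weight: {}", random_weight());
}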
if !loc.is_hidden() {
    panic!("cannot remove neurons in input or output layer");
}
`remove_neuron` is `pub`, but it assumes (without checking) that the target hidden neuron is hanging (`input_count == 0`) and does not remove inbound edges. If a caller uses this on a non-hanging neuron, inbound connections will remain and (after index downshifts) can silently retarget a different hidden neuron, corrupting the graph and the `input_count` invariants. Consider enforcing the precondition (e.g., check `self[loc].input_count == 0` and panic or return an error), or make this API `pub(crate)` and expose only the safe higher-level removal methods.
Suggested change:
// Enforce the documented precondition: this function must only be used on hanging neurons.
if self[loc].input_count != 0 {
    panic!("remove_neuron called on non-hanging neuron; use higher-level removal methods (e.g., remove_connection) instead");
}